
    The TA Framework: Designing Real-time Teaching Augmentation for K-12 Classrooms

    Recently, the HCI community has seen increased interest in the design of teaching augmentation (TA): tools that extend and complement teachers' pedagogical abilities during ongoing classroom activities. Examples of TA systems are emerging across multiple disciplines, taking various forms, e.g., ambient displays, wearables, or learning analytics dashboards. However, these diverse examples have not been analyzed together to derive more fundamental insights into the design of teaching augmentation. Addressing this opportunity, we broadly synthesize existing cases to propose the TA framework. Our framework specifies a rich design space in five dimensions to support the design and analysis of teaching augmentation. We contextualize the framework using existing design cases to surface underlying design trade-offs: for example, balancing the actionability of presented information with teachers' needs for professional autonomy, or balancing unobtrusiveness with informativeness in the design of TA systems. Applying the TA framework, we identify opportunities for future research and design. Comment: To be published in the Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems; 17 pages, 10 figures.

    Aligning the goals of learning analytics with its research scholarship: an open peer commentary approach

    To promote cross-community dialogue on matters of significance within the field of learning analytics (LA), we as editors-in-chief of the Journal of Learning Analytics (JLA) have introduced a section for papers that are open to peer commentary. An invitation to submit proposals for commentaries on the paper was released, and 12 of these proposals were accepted. The 26 authors of the accepted commentaries are based in Europe, North America, and Australia. They range in experience from PhD students and early-career researchers to some of the longest-standing, most senior members of the learning analytics community. This paper brings those commentaries together, and we recommend reading it as a companion piece to the original paper by Motz et al. (2023), which also appears in this issue.

    The Dashboard That Loved Me: Designing adaptive learning analytics for self-regulated learning

    Learning dashboards are learning analytics (LA) tools built to make learners aware of their learning performance and behaviour and to support self-reflection. However, most existing dashboards follow a “one size fits all” philosophy, disregarding individual differences between learners, e.g., differences that stem from diverse cultural backgrounds, different motivations for learning, or different levels of self-regulated learning (SRL) skills. In this thesis, we challenge the assumption that impactful learning analytics should be limited to making learners aware of their learning; instead, it should encourage and support learners in taking action and changing their learning behaviour. We therefore take a learner-centred approach and explore what information learners choose to receive on learning analytics dashboards and how this choice relates to their learning motivation and their SRL skills. We also investigate how dashboard designs support learners in making sense of the displayed information, and how learner goals and level of SRL skills influence what learners find relevant on such interfaces. Large-scale experiments conducted with both higher education students and MOOC learners bring empirical evidence on how aligning the design of learning analytics dashboards with learners’ intentions, learning motivation and level of self-regulated learning skills influences the uptake and impact of such tools. The outcomes of these studies are synthesised in eleven recommendations for learning analytics dashboard design, grouped according to the phase of the dashboard life cycle to which they apply: (i) methodological aspects to be considered before designing dashboards, (ii) design requirements to be considered during the design phase, and (iii) support offered to learners after the dashboard has been rolled out.

    The Learning Tracker: A Learner Dashboard that Encourages Self-regulation in MOOC Learners

    Massive Open Online Courses (MOOCs) have the potential to make quality education affordable and available to the masses and to reduce the gap between the most privileged and the most disadvantaged learners worldwide. However, this potential is overshadowed by low completion rates, often below 15%. Because of the high level of autonomy required when learning with a MOOC, the literature identifies limited self-regulated learning skills as one of the causes of early dropout in MOOCs. Moreover, existing tools designed to aid learners in the online learning environment fail to provide the support needed for the development of such skills. The aim of the present work is to bridge this gap by investigating how self-regulated learning skills can be enhanced by encouraging metacognition and reflection in MOOC learners by means of social comparison. To this end, following an iterative process, we have developed the Learning Tracker, an interactive widget that allows learners to visualise their learning behaviour and compare it to that of previous graduates of the same MOOC. Each iteration was extensively evaluated in live TU Delft MOOCs running on the edX platform, engaging over 20,000 MOOC learners over the whole duration of each MOOC. Our results show that learners who have access to the Learning Tracker are more likely to graduate from the MOOC. Moreover, we observed that the widget has a positive impact on learners' engagement and reduces procrastination. However, we found little evidence that learners improved their self-regulated learning skills by the end of the MOOCs. Based on our results, we argue that the mere fact of receiving feedback on a limited number of learning habits can trigger self-reflection in learners and lead to improved learner performance. This work underlines the powerful effect that feedback and self-reflection on one's own behaviour have on learning performance. We recommend that future research investigate learners' feedback literacy and devise effective ways of presenting learners with personalised feedback based on their goals, learning skill level and cultural background.

    The effect of the COVID-19 pandemic on a MOOC in Aerospace Structures and Materials

    In March 2020, COVID-19 brought the world, and with it aviation, to a standstill. Also in March 2020, the third run of the DelftX MOOC Introduction to Aerospace Structures and Materials started on edX. This MOOC generally attracts a mixture of young aviation enthusiasts (often students) and aviation professionals. Given the large interest MOOCs have received as the pandemic hit, we investigate how the new global context affected the motivation of our learners and the way they interact with our course material. For this project, we will use learning analytics approaches to analyse the log data available from the edX platform and the data from pre- and post-course evaluations of two runs of the same MOOC (2019 and 2020). With the insights gathered through this analysis, we wish to better understand our learners and adjust the learning design of the course to better suit their needs. Our paper will present the first insights from this analysis.

    From students with love: An empirical study on learner goals, self-regulated learning and sense-making of learning analytics in higher education

    Unequal stakeholder engagement is a common pitfall of learning analytics adoption in higher education, leading to lower buy-in and flawed tools that fail to meet the needs of their target groups. With each design decision, we make assumptions about how learners will make sense of the visualisations, yet we know very little about how students make sense of dashboards and which aspects influence their sense-making. We investigated how learner goals and self-regulated learning (SRL) skills influence dashboard sense-making, following a mixed-methods research methodology: a qualitative pre-study followed up by an extensive quantitative study with 247 university students. We uncovered three latent variables for sense-making: transparency of design, reference frames, and support for action. SRL skills are predictors of how relevant students find these constructs. Learner goals have a significant effect only on the perceived relevance of reference frames. Knowing which factors influence students' sense-making will lead to more inclusive and flexible designs that cater to the needs of both novice and expert learners.

    Quantum of Choice: How learners’ feedback monitoring decisions, goals and self-regulated learning skills are related

    Learning analytics dashboards (LADs) are designed as feedback tools for learners but, until recently, learners have rarely had a say in how LADs are designed and what information they receive through them. To overcome this shortcoming, we developed a customisable LAD for Coursera MOOCs on which learners can set goals and choose indicators to monitor. Following a mixed-methods approach, we analyse the indicator selection behaviour of 401 learners in order to understand the decisions they make on the LAD and whether learner goals and self-regulated learning skills influence these decisions. We found that learners overwhelmingly chose indicators about completed activities. Goals are not associated with indicator selection behaviour, while help-seeking skills predict learners' choice to monitor their engagement in discussions, and time management skills predict learners' interest in procrastination indicators. The findings have implications for our understanding of learners' use of LADs and for their design.